Geometrically Intrinsic Nonlinear Recursive Filters I: Algorithms
The Geometrically Intrinsic Nonlinear Recursive Filter, or GI Filter, is
designed to estimate an arbitrary continuous-time Markov diffusion process X
subject to nonlinear discrete-time observations. The GI Filter is fundamentally
different from the much-used Extended Kalman Filter (EKF), and its second-order
variants, even in the simplest nonlinear case, in that: (i) It uses a quadratic
function of a vector observation to update the state, instead of the linear
function used by the EKF. (ii) It is based on deeper geometric principles,
which make the GI Filter coordinate-invariant. This implies, for example, that
if a linear system were subjected to a nonlinear transformation f of the
state-space and analyzed using the GI Filter, the resulting state estimates and
conditional variances would be the push-forward under f of the Kalman Filter
estimates for the untransformed system - a property which is not shared by the
EKF or its second-order variants.
The noise covariance of X and the observation covariance themselves induce
geometries on state space and observation space, respectively, and associated
canonical connections. A sequel to this paper develops stochastic differential
geometry results - based on "intrinsic location parameters", a notion derived
from the heat flow of harmonic mappings - from which we derive the
coordinate-free filter update formula. The present article presents the
algorithm with reference to a specific example - the problem of tracking and
intercepting a target, using sensors based on a moving missile. Computational
experiments show that, when the observation function is highly nonlinear, there
exist choices of the noise parameters at which the GI Filter significantly
outperforms the EKF.
Comment: 22 pages, 4 figures
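As background for the contrast drawn in this abstract, the following is a minimal scalar Kalman measurement update, the linear special case that the EKF extends and that, per the abstract, the GI Filter reproduces for linear systems. This sketch is standard textbook material, not code from the paper; H, R, and K follow conventional Kalman notation.

```python
def kalman_update(x, P, z, H, R):
    """One scalar Kalman measurement update.

    x, P : prior state estimate and its variance
    z    : observation
    H    : linear observation coefficient (z = H*x + noise)
    R    : observation noise variance
    """
    S = H * P * H + R          # innovation variance
    K = P * H / S              # Kalman gain
    x_new = x + K * (z - H * x)  # linear update in the innovation
    P_new = (1.0 - K * H) * P    # posterior variance
    return x_new, P_new
```

The EKF applies this same linear-in-z update after linearizing a nonlinear observation function; the GI Filter instead uses a quadratic function of the observation, which is what makes its estimates coordinate-invariant.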
Differential equation approximations for Markov chains
We formulate some simple conditions under which a Markov chain may be
approximated by the solution to a differential equation, with quantifiable
error probabilities. The role of a choice of coordinate functions for the
Markov chain is emphasised. The general theory is illustrated in three
examples: the classical stochastic epidemic, a population process model with
fast and slow variables, and core-finding algorithms for large random
hypergraphs.
Comment: Published in the Probability Surveys (http://www.i-journals.org/ps/)
at http://dx.doi.org/10.1214/07-PS121 by the Institute of Mathematical
Statistics (http://www.imstat.org)
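The classical stochastic epidemic mentioned in the abstract is the standard illustration of this kind of differential equation approximation. The sketch below is our illustration, not code from the paper: it simulates the density-dependent SIR Markov chain (Gillespie algorithm) and Euler-integrates the corresponding fluid-limit ODE, using the scaled counts s = S/N, i = I/N as the coordinate functions; the rates beta, gamma and all numerical values are illustrative assumptions.

```python
import random

def sir_drift(s, i, beta=2.0, gamma=1.0):
    """Drift vector field of the SIR fluid limit in densities (s, i)."""
    return (-beta * s * i, beta * s * i - gamma * i)

def sir_ode(s0, i0, t_end=10.0, dt=1e-3, beta=2.0, gamma=1.0):
    """Euler integration of the fluid-limit ODE."""
    s, i = s0, i0
    for _ in range(int(t_end / dt)):
        ds, di = sir_drift(s, i, beta, gamma)
        s += dt * ds
        i += dt * di
    return s, i

def sir_ctmc(n=10000, i0_frac=0.01, t_end=10.0, beta=2.0, gamma=1.0, seed=0):
    """Gillespie simulation of the density-dependent SIR chain on n individuals."""
    rng = random.Random(seed)
    s, i = n - int(i0_frac * n), int(i0_frac * n)
    t = 0.0
    while i > 0 and t < t_end:
        rate_inf = beta * s * i / n   # infection rate
        rate_rec = gamma * i          # recovery rate
        total = rate_inf + rate_rec
        t += rng.expovariate(total)
        if t >= t_end:
            break
        if rng.random() < rate_inf / total:
            s -= 1; i += 1            # infection event
        else:
            i -= 1                    # recovery event
    return s / n, i / n
```

For large n the scaled chain trajectory stays close to the ODE solution, with fluctuations of order n^(-1/2), which is the quantifiable error the abstract refers to.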
Structure of large random hypergraphs
The theme of this paper is the derivation of analytic formulae for certain
large combinatorial structures. The formulae are obtained via fluid limits of
pure jump type Markov processes, established under simple conditions on the
Laplace transforms of their Lévy kernels. Furthermore, a related Gaussian
approximation allows us to describe the randomness which may persist in the
limit when certain parameters take critical values. Our method is quite
general, but is applied here to vertex identifiability in random hypergraphs. A
vertex v is identifiable in n steps if there is a hyperedge containing v, all of
whose other vertices are identifiable in fewer than n steps. We say that a
hyperedge is identifiable if every one of its vertices is identifiable. Our
analytic formulae describe the asymptotics of the number of identifiable
vertices and the number of identifiable hyperedges for a Poisson random
hypergraph on a set of N vertices, in the limit as N goes to infinity.
Comment: Revised version with minor conceptual improvements and additional
discussion. 32 pages, 5 figures
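The identifiability recursion defined in the abstract can be computed by a simple fixed-point (peeling) iteration. The sketch below is our illustration of that definition, not the paper's algorithm, and the function and variable names are invented: a vertex in a singleton hyperedge is identifiable immediately, and a vertex becomes identifiable once some hyperedge containing it has all of its other vertices already identified.

```python
def identifiable(hyperedges):
    """Compute identifiable vertices and hyperedges by fixed-point iteration.

    hyperedges : iterable of sets of vertices.
    A vertex v is identifiable if some hyperedge containing v has all of
    its other vertices already identifiable (vacuously true for {v}).
    A hyperedge is identifiable if every one of its vertices is.
    """
    ident = set()
    changed = True
    while changed:
        changed = False
        for e in hyperedges:
            for v in e:
                if v not in ident and all(u in ident for u in e if u != v):
                    ident.add(v)
                    changed = True
    ident_edges = [e for e in hyperedges if all(v in ident for v in e)]
    return ident, ident_edges
```

The paper's analytic formulae describe the N-to-infinity asymptotics of the sizes of these two sets for a Poisson random hypergraph; this finite sketch only makes the recursive definition concrete.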